“Significant concerns” raised over impact of data center growth on regional energy grids
Researchers say they have “significant concerns” about the effect rapid growth in data center electricity demand is having on regional energy networks.
Schneider Electric has published an investigation into AI’s possible electricity consumption over the next decade, using system dynamics modeling to forecast a range of demand scenarios.
“The breadth of AI’s projected electricity demand encompasses a wide array of interconnected issues. These issues range from infrastructure challenges to supply chain disruptions and socio-economic concerns,” the report stated.
It added that while data center electricity use only accounts for 0.3% of worldwide electricity demand at the moment, rapid growth is anticipated globally.
For example, the International Energy Agency (IEA) projected electricity consumption associated with global data center demand to more than double between 2022 and 2026, reaching more than 1,000 TWh in 2026, roughly equivalent to Japan’s total annual electricity consumption.
The report noted that the effects of this explosion in demand will not be distributed evenly, however, raising “significant concerns about regional grid stress and power capacity”.
In the US, just 15 states are responsible for hosting 80% of the country’s data centers. In northern Virginia’s ‘Data Center Alley’, sites consume roughly 25% of the region’s electricity, which could rise to 50% in a high-growth scenario.
It’s a similar story in Ireland, where data centers could consume up to 32% of the nation’s electricity by 2026, according to another IEA study, which would exceed the combined 2023 electricity consumption of all the country’s urban homes.
AI expansion may be limited due to both human and natural constraints
The report’s authors, Rémi Paccou, director of the sustainability research institute at Schneider Electric, and Professor Fons Wijnhoven, associate professor at the University of Twente, sketched out four potential scenarios of AI electricity consumption over the next decade.
In all but one of these scenarios, Paccou and Wijnhoven warn of inequality in terms of the distribution of AI inferencing capabilities as well as energy challenges.
The first scenario is one in which AI-driven advancements in energy efficiency and resource optimization result in substantial improvements in data center operations.
“A symbiotic cycle emerges between AI and the new energy system, where AI enhances system efficiency through renewable supply, demand-side electrification, and grid management, which in turn powers more sustainable AI development,” the report describes.
The key finding for this scenario is that generative AI inferencing will emerge as the dominant electricity consumer, while traditional AI will play a central role in decarbonization efforts.
Additionally, resource-conscious approaches to generative AI training will focus increasingly on less energy-intensive models.
Scenario two describes a situation in which AI expansion encounters natural or human-linked constraints. These hindrances include power availability, data scarcity, material and mineral shortages, computational resources, and regulatory restrictions.
Paccou and Wijnhoven found that this scenario also implies potential regional disparities in AI inferencing capabilities.
“Areas with more robust power infrastructure and cooler climates may have advantages in scaling their AI operations, potentially leading to geographical concentrations of AI inference capabilities.”
Uncoordinated AI governance could lead to localized energy crunches
The third scenario is one in which improvements in AI efficiency actually induce an increase in overall energy consumption as firms push for widescale AI implementation across all sectors.
The resulting “unbounded growth” collides with natural constraints, and could exacerbate AI access inequality and e-waste issues by prioritizing performance over practicality.
Finally, the fourth scenario is labeled the ‘AI Energy Crisis’, in which rapid growth of AI leads to an “unforeseen energy crisis where the technology’s electricity demand begins to conflict with other critical sectors of the economy.”
“In this scenario, very high exogenous constraints are put on local grid capacities as the compound annual growth rate (CAGR) of data center energy consumption might reach staggering levels, potentially surpassing 25% in some regions,” the report stated.
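To put these growth figures in context, the arithmetic behind them is straightforward. The sketch below uses the standard CAGR formula with illustrative numbers only: a doubling over the four years from 2022 to 2026 (as in the IEA projection above), and the report’s worst-case 25% regional CAGR. Neither calculation is taken from the report itself.

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between a start and end value."""
    return (end / start) ** (1 / years) - 1

# Demand more than doubling between 2022 and 2026 (four years)
# implies a CAGR of roughly 19%:
doubling_rate = cagr(1.0, 2.0, 4)

# Conversely, a 25% CAGR sustained over the same four years would
# multiply consumption by about 2.44x:
growth_factor = (1 + 0.25) ** 4
```

Seen this way, the report’s 25% figure is even steeper than the growth rate implied by the IEA’s doubling projection.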
In this scenario, uncoordinated AI governance is projected to lead to disunified regulation and localized blackouts in high-demand areas.
The report added that piling additional resources into new areas like synthetic data and multimodal learning will likely intensify local energy crunches associated with AI training.
“The geographical implications of these trends could lead to a concentration of AI development in areas with access to both vast amounts of data and energy, potentially exacerbating local energy challenges in these regions.”